298 research outputs found

    Ecological restoration as a real-world experiment: designing robust implementation strategies in an urban environment

    The concept of real-world experiments is a framework for understanding environmental design projects under real-world conditions. Unlike laboratory experiments, which are generally thought to exclude the public, real-world experiments involve combinations of social and natural factors. In this paper, the theory of real-world experiments is applied to the fieldwork of ecological restoration. The case discussed here is an ecological design process at Montrose Point, a peninsula built on landfill in Lake Michigan on the North Side of Chicago. It illustrates how, in the practice of ecological restoration, the idea of the experiment can be understood as built on processes of recursive learning that include different parts of the wider society and nature. The paper outlines a concept of robust implementation strategies in which public involvement is a pivotal part of a more encompassing activity of ecological practice. The aim is a better understanding of the learning processes taking place in natural and social systems.

    On the Effect of Inband Signaling and Realistic Channel Knowledge on Dynamic OFDM-FDMA Systems

    Dynamically assigning the subcarriers of OFDM systems to multiple different terminals in a cell has been shown to be beneficial in terms of several transmission metrics. However, the success of such a scheme depends on the ability of the access point to inform terminals of their newest subcarrier assignments, as well as on the accuracy of the channel state information used to generate new assignments. It is not clear whether the overhead required to implement these two capabilities consumes all of the potential performance increase offered by dynamically assigning subcarriers.

    Chinese and American Perceptions on Nonprofit Organizational Effectiveness

    Dynamically assigning subcarriers of orthogonal frequency division multiplexing (OFDM) systems to multiple different terminals in a cell has been shown to be beneficial in terms of several transmission metrics. The success of such a scheme, however, depends on the ability of the access point to inform terminals of their newest subcarrier assignments, as well as on the accuracy of the channel state information used to generate new assignments. It is not clear whether the overhead required to implement these two functions consumes all of the potential performance increase offered by dynamically assigning subcarriers. In this paper, a specific MAC structure enabling the operation of a dynamic OFDM system is selected, incorporating a signalling scheme for dynamically assigned subcarriers. Based on this structure, we study the overhead required for a dynamic subcarrier scheme and for a static variant that serves as a comparison case. We investigate the performance difference between these two schemes for various scenarios in which first signalling and then realistic channel knowledge is added to the system model. Average throughput and goodput per terminal serve as figures of merit; the number of terminals in the cell, the transmission power per subcarrier, the delay spread and the movement speed of the terminals are varied. We find that a realistic overhead model decreases the performance of both the static and the dynamic scheme, but that the comparison still favours the dynamic scheme in all cases except at higher terminal speeds, especially in realistic system environments.
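
    The trade-off described above can be illustrated with a toy calculation. The sketch below is not taken from the paper: the frame duration, overhead sizes and assumed rate gain are invented placeholders, and it only shows how in-band signalling can eat into the raw throughput advantage of dynamic subcarrier assignment.

        # Toy model (not from the paper): net goodput of static vs. dynamic
        # subcarrier assignment once in-band signalling overhead is accounted for.
        # All numbers below are illustrative placeholders.

        def goodput(raw_throughput_bps, frame_duration_s, signalling_bits_per_frame):
            """Subtract per-frame signalling overhead from the raw throughput."""
            overhead_bps = signalling_bits_per_frame / frame_duration_s
            return max(raw_throughput_bps - overhead_bps, 0.0)

        frame_duration = 2e-3            # 2 ms MAC frame (assumed)
        subcarriers, terminals = 64, 8   # assumed cell configuration

        # Static scheme: assignments are fixed, so essentially nothing is signalled.
        static = goodput(raw_throughput_bps=10e6, frame_duration_s=frame_duration,
                         signalling_bits_per_frame=0)

        # Dynamic scheme: assume a 20 % raw-rate gain from channel-aware assignment,
        # but every frame must announce which terminal owns each subcarrier.
        bits_per_assignment = 3                      # log2(8 terminals)
        signalling = subcarriers * bits_per_assignment
        dynamic = goodput(raw_throughput_bps=12e6, frame_duration_s=frame_duration,
                          signalling_bits_per_frame=signalling)

        print(f"static : {static/1e6:.2f} Mbit/s")
        print(f"dynamic: {dynamic/1e6:.2f} Mbit/s (after signalling overhead)")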

    Optimized Transformation and Verification of SystemC Methods

    Concurrent designs can be automatically verified by transforming them into an automata-based representation and model checking the resulting model. However, when a concurrent design is transformed into an automata-based representation, each method has to be translated into a single automaton, which produces a significant overhead for model checking. In this paper, we present an optimization of our previously proposed transformation from SystemC into Uppaal timed automata. The main idea is to analyze whether SystemC methods can be executed atomically and to use the results to generate a reduced automata model. We have implemented the optimized transformation in our SystemC to Timed Automata Transformation Engine (STATE) and demonstrate the effect of the optimization with experimental results from microbenchmarks, a simple producer-consumer example, and an Anti-Slip Regulation and Anti-lock Braking System (ASR/ABS).
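
    As a rough illustration of the kind of analysis described above (deciding whether a SystemC method can run atomically, so that it need not be split at every scheduling point), the sketch below scans a method body for context-switch constructs such as wait(). It is a heavy simplification of the static analysis in STATE, and the list of constructs is an assumption made for this sketch.

        import re

        # Simplified stand-in for the atomicity analysis: a SystemC method is
        # treated as atomic if its body contains no construct at which the
        # SystemC scheduler could interleave other processes. The construct
        # list below is an illustrative assumption, not the rule set of STATE.
        SCHEDULING_POINTS = re.compile(r"\bwait\s*\(|\bnext_trigger\s*\(")

        def is_atomic(method_body: str) -> bool:
            """Return True if no potential scheduling point occurs in the body."""
            return SCHEDULING_POINTS.search(method_body) is None

        producer = """
        void produce() {
            while (true) {
                buffer.write(item);
                wait(clk.posedge_event());   // scheduling point -> not atomic
            }
        }
        """

        checker = """
        void check_flags() {
            status = a_flag && b_flag;       // plain computation, no wait()
        }
        """

        print(is_atomic(producer))  # False: must be modelled with interleavings
        print(is_atomic(checker))   # True: can be collapsed into one transition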

    Retrieval of ice-nucleating particle concentrations from lidar observations and comparison with UAV in situ measurements

    Aerosols that are efficient ice-nucleating particles (INPs) are crucial for the formation of cloud ice via heterogeneous nucleation in the atmosphere. The distribution of INPs on a large spatial scale and as a function of height determines their impact on clouds and climate. However, in situ measurements of INPs provide sparse coverage over space and time. A promising approach to address this gap is to retrieve INP concentration profiles by combining particle concentration profiles derived from lidar measurements with INP efficiency parameterizations for different freezing mechanisms (immersion freezing, deposition nucleation). Here, we assess the feasibility of this new method for both ground-based and spaceborne lidar measurements, using in situ observations collected with unmanned aerial vehicles (UAVs) during an experimental campaign in Cyprus in April 2016 and subsequently analyzed with the FRIDGE (FRankfurt Ice nucleation Deposition freezinG Experiment) INP counter. Analyzing five case studies, we calculated the cloud-relevant particle number concentrations using lidar measurements (n250,dry with an uncertainty of 20 % to 40 % and Sdry with an uncertainty of 30 % to 50 %), and we assessed the suitability of the different INP parameterizations with respect to the temperature range and the type of particles considered. Specifically, our analysis suggests that our calculations using the parameterization of Ullrich et al. (2017) (applicable for the temperature range −50 to −33 °C) agree within 1 order of magnitude with the in situ observations of nINP; this parameterization can therefore efficiently address the deposition nucleation pathway in dust-dominated environments. Additionally, our calculations using the combination of the parameterizations of DeMott et al. (2015, 2010) (applicable for the temperature range −35 to −9 °C) agree within 2 orders of magnitude with the in situ observations of INP concentrations (nINP) and can thus efficiently address the immersion/condensation pathway of dust and nondust particles. The same conclusion is derived from the compilation of the parameterizations of DeMott et al. (2015) for dust and Ullrich et al. (2017) for soot.
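
    The retrieval step described above essentially feeds a lidar-derived aerosol proxy (such as n250,dry) and a temperature into an INP efficiency parameterization. The sketch below shows only the generic functional form of DeMott-type immersion-freezing parameterizations; the coefficients a, b, c, d are deliberately left as placeholders to be taken from the respective publication rather than quoted here, and the example values are dummies used to show the structure.

        def inp_demott_type(n_aer_250, temp_c, a, b, c, d):
            """
            Generic DeMott-type immersion-freezing parameterization:

                n_INP(T) = a * (T0 - T)^b * n_aer^( c*(T0 - T) + d )

            with T0 = 0 degC, T the cloud temperature in degC, and n_aer the
            number concentration of particles larger than 250 nm. The
            coefficients a, b, c, d are placeholders; the published values of
            the chosen parameterization (e.g. DeMott et al., 2010/2015) define
            both their magnitudes and the output units.
            """
            supercooling = -temp_c  # degrees below 0 degC
            return a * supercooling**b * n_aer_250**(c * supercooling + d)

        # Example call with dummy coefficients (structure only, not real values):
        n_inp = inp_demott_type(n_aer_250=5.0, temp_c=-25.0,
                                a=1e-5, b=3.0, c=0.03, d=0.0)
        print(f"{n_inp:.3e} INP per unit volume (dummy coefficients)")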

    Inefficacy of different strategies to improve guideline awareness – 5-year follow-up of the hypertension evaluation project (HEP)

    Background: In spite of numerous guidelines for evidence-based diagnostics and therapy, adequate knowledge of current recommendations is disappointingly low. In the Hypertension Evaluation Project (HEP I) we showed that awareness of the national hypertension guidelines among German practitioners was less than 25% in the year 2000. This indicates the need for efficient strategies to substantially improve guideline awareness. Methods: To assess different tools for improving guideline knowledge, we used three strategies (guideline in print, interactive guideline, expert seminars) to train 8325 randomised physicians who had participated in the HEP I trial. Guideline knowledge of the trained physicians was again tested with the HEP questionnaire and compared to a control group of HEP I physicians. Results: The return rate of questionnaires was 57.9%, without a significant difference between the groups. Overall guideline awareness was still low but markedly improved compared to the results of HEP I (37.1% vs. 23.7%, p < 0.0001). There was no difference between the trained physicians and the control group (35.8% and 35.9% vs. 39.7%, p = n.s.). Conclusion: We investigated the influence of different strategies to improve guideline awareness among German physicians. None of our interventions (guideline in print, interactive guideline, expert seminars) brought a notable benefit compared to the control group. However, overall knowledge of the guideline contents increased from 23.7% to 37.1% over five years. Therefore, other, probably multimodal, interventions are necessary to improve guideline awareness significantly beyond the spontaneous advancement. Trial registration: ISRCTN53383289.

    Lumpability Abstractions of Rule-based Systems

    The induction of a signaling pathway is characterized by transient complex formation and mutual posttranslational modification of proteins. Faithfully capturing this combinatorial process in a mathematical model is an important challenge in systems biology. Exploiting the limited context on which most binding and modification events are conditioned, attempts have been made to reduce the combinatorial complexity by quotienting the reachable set of molecular species into species aggregates while preserving the deterministic semantics of the thermodynamic limit. Recently we proposed a quotienting that also preserves the stochastic semantics and that is complete in the sense that the semantics of individual species can be recovered from the aggregate semantics. In this paper we prove that this quotienting yields a sufficient condition for weak lumpability and that it gives rise to a backward Markov bisimulation between the original and the aggregated transition system. We illustrate the framework on a case study of the EGF/insulin receptor crosstalk. (In Proceedings MeCBIC 2010, arXiv:1011.005)
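
    For orientation, the sketch below checks the classical (strong/ordinary) lumpability condition for a CTMC generator matrix with respect to a given state-space partition: every state in a block must have the same aggregate rate into each other block. This is only the simpler textbook criterion, not the weak lumpability and backward Markov bisimulation conditions established in the paper.

        import numpy as np

        def is_strongly_lumpable(Q, partition, tol=1e-9):
            """
            Classical (strong) lumpability test for a CTMC generator Q:
            for every pair of distinct blocks B, C of the partition, all
            states in B must have the same total rate into C. (Textbook
            criterion only; the paper works with the weaker notion.)
            """
            for B in partition:
                for C in partition:
                    if B is C:
                        continue
                    rates = [Q[x, sorted(C)].sum() for x in sorted(B)]
                    if max(rates) - min(rates) > tol:
                        return False
            return True

        # Small example: states 0 and 1 behave identically towards state 2,
        # so the partition {{0,1},{2}} satisfies the strong condition.
        Q = np.array([[-2.0,  1.0,  1.0],
                      [ 1.0, -2.0,  1.0],
                      [ 0.5,  0.5, -1.0]])
        print(is_strongly_lumpable(Q, [{0, 1}, {2}]))  # True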

    Zero modes, beta functions and IR/UV interplay in higher-loop QED

    We analyze the relation between the short-distance behavior of quantum field theory and the strong-field limit of the background field formalism for QED effective Lagrangians in self-dual backgrounds, at both one and two loops. The self-duality of the background leads to zero modes in the case of spinor QED, and these zero modes must be taken into account before comparing the perturbative beta function coefficients with the coefficients of the strong-field limit of the effective Lagrangian. At one loop this is familiar from instanton physics, but we find that at two loops the role of the zero modes, and the interplay between IR and UV effects in the renormalization, is quite different. Our analysis is motivated in part by the remarkable simplicity of the two-loop QED effective Lagrangians for a self-dual constant background, and we also present a new independent derivation of these two-loop results.
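
    For orientation, the standard one-loop connection that this analysis generalizes can be recalled as follows. These are the textbook Euler-Heisenberg relations for spinor QED in a constant magnetic background, quoted from memory and not from the self-dual, two-loop computation of the paper, so the coefficients should be checked against the original references:

        \[
          \beta^{(1)}(e) = \frac{e^{3}}{12\pi^{2}}, \qquad
          \mathcal{L}^{(1)}(B) \;\sim\; \frac{e^{2}B^{2}}{24\pi^{2}}\,\ln\frac{eB}{m^{2}}
          \;=\; \frac{\beta^{(1)}(e)}{2e}\,B^{2}\,\ln\frac{eB}{m^{2}},
          \qquad eB \gg m^{2},
        \]

    i.e. the coefficient of the strong-field logarithm is fixed by the one-loop beta-function coefficient; the paper's point is that for self-dual backgrounds, and especially at two loops, this matching is modified by the fermionic zero modes.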

    Integration of Genome-Wide SNP Data and Gene-Expression Profiles Reveals Six Novel Loci and Regulatory Mechanisms for Amino Acids and Acylcarnitines in Whole Blood

    Profiling amino acids and acylcarnitines in whole blood spots is a powerful tool in the laboratory diagnosis of several inborn errors of metabolism. Emerging data suggest that altered blood levels of amino acids and acylcarnitines are also associated with common metabolic diseases in adults. Thus, the identification of common genetic determinants of blood metabolites might shed light on pathways contributing to human physiology and common diseases. We applied a targeted mass-spectrometry-based method to analyze whole blood concentrations of 96 amino acids, acylcarnitines and pathway-associated metabolite ratios in a Central European cohort of 2,107 adults and performed a genome-wide association (GWA) study to identify genetic modifiers of metabolite concentrations. We discovered and replicated six novel loci associated with blood levels of total acylcarnitine and arginine (both on chromosome 6; rs12210538, rs17657775), propionylcarnitine (chromosome 10; rs12779637), 2-hydroxyisovalerylcarnitine (chromosome 21; rs1571700), stearoylcarnitine (chromosome 1; rs3811444), and aspartic acid traits (chromosome 8; rs750472). Based on an integrative analysis of expression quantitative trait loci in blood mononuclear cells and correlations between gene expression and metabolite levels, we provide evidence for putative causative genes: SLC22A16 for total acylcarnitines, ARG1 for arginine, HLCS for 2-hydroxyisovalerylcarnitine, JAM3 for stearoylcarnitine via a trans-effect at chromosome 1, and PPP1R16A for aspartic acid traits. Further, we report replication and provide additional functional evidence for ten loci that have previously been published for metabolites measured in plasma, serum or urine. In conclusion, our integrative analysis of SNP, gene-expression and metabolite data points to novel genetic factors that may be involved in the regulation of human metabolism. At several loci, we provide evidence for metabolite regulation via gene expression, and we observed overlaps with GWAS loci for common diseases. These results form a strong rationale for subsequent functional and disease-related studies.
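
    The integrative step described above (testing whether a GWAS hit also acts as an eQTL for a candidate gene, and whether that gene's expression correlates with the metabolite) can be sketched in a few lines. The data frame columns, the toy values, and the additive 0/1/2 genotype coding below are illustrative assumptions made for this sketch, not the cohort's actual variables or the exact models used in the study.

        import pandas as pd
        from scipy import stats

        # Illustrative only: column names and values are dummies; the SNP and
        # gene identifiers are reused from the abstract purely as labels.
        df = pd.DataFrame({
            "rs12210538":          [0, 1, 2, 0, 1, 2, 1, 0],   # minor-allele count
            "SLC22A16_expr":       [5.1, 5.6, 6.2, 5.0, 5.8, 6.4, 5.7, 4.9],
            "total_acylcarnitine": [31., 35., 40., 30., 36., 41., 34., 29.],
        })

        # (1) SNP -> metabolite: the GWAS-style association test.
        gwas = stats.linregress(df["rs12210538"], df["total_acylcarnitine"])

        # (2) SNP -> expression: does the same SNP act as an eQTL for the gene?
        eqtl = stats.linregress(df["rs12210538"], df["SLC22A16_expr"])

        # (3) Expression -> metabolite: correlation supporting a putative chain
        #     SNP -> gene expression -> metabolite level.
        rho, p_rho = stats.spearmanr(df["SLC22A16_expr"], df["total_acylcarnitine"])

        print(f"SNP~metabolite   slope={gwas.slope:.2f}, p={gwas.pvalue:.3g}")
        print(f"SNP~expression   slope={eqtl.slope:.2f}, p={eqtl.pvalue:.3g}")
        print(f"expr~metabolite  rho={rho:.2f}, p={p_rho:.3g}")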